Low Discrepancy Sequences and Learning

Author

  • Ram Rajagopal
Abstract

The Discrepancy Method is a constructive method for proving upper bounds that has received a lot of attention in recent years. In this paper we revisit a few important results and show how the method can be applied to problems in Machine Learning, such as Empirical Risk Minimization and Risk Estimation, by exploiting connections with combinatorial dimension theory.
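The bridge between discrepancy and risk estimation that the abstract alludes to is, at bottom, the Koksma–Hlawka inequality: the average of a loss over N sample points deviates from the true risk by at most the variation of the loss times the star discrepancy of the point set. A minimal Python sketch of the idea, assuming a one-dimensional input and a made-up loss (an illustration only, not code from the paper):

def van_der_corput(n: int, base: int = 2) -> float:
    """n-th term of the base-b van der Corput sequence (radical inverse of n)."""
    value, denom = 0.0, 1.0
    while n > 0:
        n, digit = divmod(n, base)
        denom *= base
        value += digit / denom
    return value

def qmc_risk(loss, n_points: int = 1024) -> float:
    """Estimate E[loss(X)] for X uniform on [0,1) by averaging over a
    deterministic low-discrepancy point set instead of i.i.d. samples."""
    return sum(loss(van_der_corput(i)) for i in range(1, n_points + 1)) / n_points

# Toy risk: squared loss of the threshold classifier h(x) = 1{x > 0.5}
# against the target f(x) = x; the exact value is 1/12 ≈ 0.0833.
print(qmc_risk(lambda x: (float(x > 0.5) - x) ** 2))

Because the van der Corput points have star discrepancy O(log N / N), the estimate converges faster in N than the O(N^(-1/2)) Monte Carlo rate for losses of bounded variation.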


Similar Articles

Pattern Recognition as a Deterministic Problem: An Approach Based on Discrepancy

When the position of each input vector in the training set is not fixed beforehand, a deterministic approach can be adopted to address the general problem of learning. In particular, the consistency of the Empirical Risk Minimization (ERM) principle can be established when the points in the input space are generated through a purely deterministic algorithm (deterministic learning). When the ...


On Atanassov's methods for discrepancy bounds of low-discrepancy sequences

The aim of this survey paper is to give an updated overview of results on low-discrepancy sequences obtained via Atanassov's methods. These methods, first developed for Halton sequences, were later extended to generalizations of those sequences and to (t, s)-sequences, the other well-known family of low-discrepancy sequences, including polynomial arithmetic analogues of Halton sequences.
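For context, the Halton sequence these methods were first developed for applies the van der Corput radical inverse coordinate-wise, with a distinct prime base per dimension. A textbook sketch in Python (the standard construction, not Atanassov's discrepancy analysis):

def radical_inverse(n: int, base: int) -> float:
    """Reflect the base-b digits of n about the radix point (van der Corput map)."""
    value, denom = 0.0, 1.0
    while n > 0:
        n, digit = divmod(n, base)
        denom *= base
        value += digit / denom
    return value

def halton(n: int, bases=(2, 3, 5)) -> tuple:
    """n-th point of the Halton sequence in [0,1)^s, one prime base per coordinate."""
    return tuple(radical_inverse(n, b) for b in bases)

print([halton(i) for i in (1, 2, 3)])
# [(0.5, 0.333..., 0.2), (0.25, 0.666..., 0.4), (0.75, 0.111..., 0.6)]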


Deep Unsupervised Domain Adaptation for Image Classification via Low Rank Representation Learning

Domain adaptation is a powerful technique when ample labeled data with similar attributes is available in related domains. In real-world applications there is a huge amount of data, but most of it is unlabeled. This makes domain adaptation effective for image classification, where adequate labeled data is expensive and time-consuming to obtain. We propose a novel method named DALRRL, which consists of deep ...


A Deterministic Learning Approach Based on Discrepancy

The general problem of reconstructing an unknown function from a finite collection of samples is considered, in the case where the position of each input vector in the training set is not fixed beforehand but is part of the learning process. In particular, the consistency of the Empirical Risk Minimization (ERM) principle is analyzed when the points in the input space are generated by employing a purely ...
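A toy instance of this setting (my illustration, not the paper's procedure): the training inputs form a fixed deterministic design on [0, 1), and ERM returns the hypothesis with the least empirical risk over that design.

N = 64
xs = [(2 * k + 1) / (2 * N) for k in range(N)]        # deterministic input design

def target(x):                                        # unknown function to recover
    return float(x > 0.4)

hyps = [lambda x, t=t: float(x > t) for t in (0.2, 0.4, 0.6, 0.8)]

def emp_risk(h):
    return sum((h(x) - target(x)) ** 2 for x in xs) / N

best = min(hyps, key=emp_risk)                        # Empirical Risk Minimization
print(emp_risk(best))                                 # 0.0: the design identifies t = 0.4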


Regularities of the distribution of abstract van der Corput sequences

As with β-adic van der Corput sequences, abstract van der Corput sequences can be defined by abstract numeration systems. Under some assumptions, these sequences are low-discrepancy sequences. The discrepancy function is computed explicitly, and the bounded remainder sets of the form [0, y) are characterized.
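To make "the discrepancy function is computed explicitly" concrete: in one dimension the star discrepancy of a finite point set has a classical closed form over the sorted points (Niederreiter's formula, not this paper's abstract-numeration machinery). A short Python sketch:

def star_discrepancy(points):
    """Exact 1-D star discrepancy via Niederreiter's formula:
    D*_N = max_i max((i+1)/N - x_(i), x_(i) - i/N) over the sorted points."""
    xs = sorted(points)
    n = len(xs)
    return max(max((i + 1) / n - x, x - i / n) for i, x in enumerate(xs))

# The centered grid (2k+1)/(2N) attains the optimal value 1/(2N):
print(star_discrepancy([(2 * k + 1) / 16 for k in range(8)]))  # 0.0625 = 1/16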




Publication year: 2004